Factorization Machines: Factorized Polynomial Regression Models

Authors

  • Christoph Freudenthaler
  • Lars Schmidt-Thieme
  • Steffen Rendle
Abstract

Factorization Machines (FMs) are a new model class that combines the advantages of polynomial regression models with those of factorization models. Like polynomial regression models, FMs are a general model class that works with any real-valued feature vector as input to predict real-valued, ordinal or categorical dependent variables as output. In contrast to polynomial regression models, however, FMs replace two-way and all higher-order interaction effects with their factorized analogues. The factorization of higher-order interactions enables efficient parameter estimation even for sparse datasets where only a few or even no observations of those higher-order effects are available; polynomial regression models without this factorization fail. This work discusses the relationship of FMs to polynomial regression models and the conceptual difference between factorizing and non-factorizing model parameters of polynomial regression models. Additionally, we show that the model equation of factorized polynomial regression models can be computed in linear time and thus FMs can be learned efficiently. Apart from polynomial regression models, we also focus on the other origin of FMs: factorization models. We show that the standard factorization models, matrix factorization and parallel factor analysis, are special cases of FMs, and additionally how recently proposed and successfully applied factorization models like SVD++ can be represented by FMs. The drawback of typical factorization models is that they are not applicable to general prediction tasks but work only with categorical input data. Furthermore, their model equations and optimization algorithms are derived individually for each task. Since FMs can mimic these models simply by choosing the appropriate input data, we conclude that deriving a learning algorithm for FMs once, e.g., Stochastic Gradient Descent (SGD) or Alternating Least Squares (ALS), is sufficient to obtain the learning algorithms for all of these factorization models automatically, saving considerable time and effort.
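To make the linear-time claim concrete, the following is a minimal NumPy sketch of the standard second-order FM model equation; the function name, array shapes and the toy one-hot example are illustrative assumptions, not code from the paper. The pairwise term sum_{i<j} <v_i, v_j> x_i x_j is rewritten as 0.5 * sum_f [ (sum_i v_{i,f} x_i)^2 - sum_i v_{i,f}^2 x_i^2 ], which costs O(k*n) instead of O(k*n^2).

    import numpy as np

    def fm_predict(x, w0, w, V):
        """Second-order FM prediction in O(k*n) time (illustrative sketch).

        x  : (n,)   real-valued feature vector
        w0 : float  global bias
        w  : (n,)   one-way (linear) weights
        V  : (n, k) factor matrix; row i holds the k latent factors of feature i
        """
        linear = w0 + w @ x
        xv = V.T @ x                           # (k,) per-factor weighted sums
        pairwise = 0.5 * np.sum(xv ** 2 - (V ** 2).T @ (x ** 2))
        return linear + pairwise

    # Hypothetical toy example: one-hot encoding of a (user, item) pair.
    n_users, n_items, k = 3, 4, 2
    rng = np.random.default_rng(0)
    w0, w = 0.1, rng.normal(size=n_users + n_items)
    V = rng.normal(size=(n_users + n_items, k))
    x = np.zeros(n_users + n_items)
    x[1] = 1.0               # user 1 active
    x[n_users + 2] = 1.0     # item 2 active
    print(fm_predict(x, w0, w, V))

With such a pure one-hot (user, item) encoding, the pairwise term reduces to the inner product <v_user, v_item>, i.e. biased matrix factorization, which illustrates the claim above that matrix factorization is a special case of FMs obtained just by choosing the appropriate input data.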


Similar articles

Strongly Hierarchical Factorization Machines and ANOVA Kernel Regression

High-order parametric models that include terms for feature interactions are applied to various data mining tasks, where ground truth depends on interactions of features. However, with sparse data, the high-dimensional parameters for feature interactions often face three issues: expensive computation, difficulty in parameter estimation and lack of structure. Previous work has proposed approaches...


Polynomial Networks and Factorization Machines: New Insights and Efficient Training Algorithms

Polynomial networks and factorization machines are two recently-proposed models that can efficiently use feature interactions in classification and regression tasks. In this paper, we revisit both models from a unified perspective. Based on this new view, we study the properties of both models and propose new efficient training algorithms. Key to our approach is to cast parameter learning as a ...


In-Database Factorized Learning

In this paper, we overview recent contributions on in-database analytics for a class of optimization problems that are important for LogicBlox retail-planning and forecasting applications [4, 5, 7]. The class includes ridge linear regression, polynomial regression, factorization machines, principal component analysis and classification models. Such problems are typically computed over input dat...


Multi-output Polynomial Networks and Factorization Machines

Factorization machines and polynomial networks are supervised polynomial models based on an efficient low-rank decomposition. We extend these models to the multi-output setting, i.e., for learning vector-valued functions, with application to multi-class or multi-task problems. We cast this as the problem of learning a 3-way tensor whose slices share a common decomposition and propose a convex for...


Factorization of multivariate positive Laurent polynomials

Dritschel recently proved that any positive multivariate Laurent polynomial can be factorized into a sum of square magnitudes of polynomials. We first give another proof of Dritschel's theorem. Our proof is based on the univariate matrix Fejér-Riesz theorem. Then we discuss a computational method to find approximations of the polynomial matrix factorization. Some numerical examples will be shown. F...



Journal:

Volume   Issue

Pages   -

Publication date: 2011